| 1. | Neural network with combined linear and nonlinear output nodes |
| 2. | Where j varies over all the output nodes that receive input from n. |
| 3. | Step 3: output nodes calculate their outputs on the basis of Step 2. |
| 4. | Output nodes are also used to direct certain messages to different nodes and queues. |
| 5. | On the one hand, the more n influences an output node, the more n affects the net's overall error. |
| 6. | However, n almost always influences more than one output node, and it may influence every output node. So, d(n) can be expressed as: |
| 7. | On the other hand, if the output node influences the overall error less, then n's influence correspondingly diminishes. |
| 8. | The glitch may, in turn, reduce the relative magnitude (of the input signal), causing the output node of the input inverter to switch states. |
| 9. | A more interesting difference is a pair of intermediary nodes, n1 and n2, and an increase in the number of output nodes from two to four, o1-o4. |
| 10. | Sum w, for all j where j is an output node that takes input from n. Putting this together gives us a training rule, part 1: the change in the weight between hidden node n and output node j is as follows: |
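Several of the sentences above (5, 6, 7, and 10) describe how backpropagation attributes error to a hidden node n: it sums, over every output node j that n feeds, the connecting weight times that output node's error term. A minimal sketch of that idea follows; the names `w_n`, `d_out`, `alpha`, and `o_n`, and the helper functions themselves, are hypothetical illustrations, not taken from the source.

```python
# Sketch of backpropagation error attribution for a hidden node n.
# All identifiers here (w_n, d_out, alpha, o_n) are assumed names,
# introduced only for this example.

def hidden_node_delta(w_n, d_out):
    """d(n) = sum over output nodes j of w[n][j] * d(j).

    The more n influences an output node (larger weight), and the
    larger that node's error d(j), the more n contributes to the
    net's overall error; if an output node's error is small, n's
    influence correspondingly diminishes.
    """
    return sum(w_nj * d_j for w_nj, d_j in zip(w_n, d_out))

def weight_update(alpha, d_j, o_n):
    """Training rule, part 1: change in the weight between hidden
    node n and output node j, scaled by a learning rate alpha and
    n's output o_n."""
    return alpha * d_j * o_n

# Example: hidden node n feeds four output nodes, o1-o4.
w_n = [0.5, -0.2, 0.1, 0.4]      # weights from n to each output node
d_out = [0.1, 0.3, -0.05, 0.2]   # error terms d(j) at the output nodes
d_n = hidden_node_delta(w_n, d_out)
```

This mirrors sentence 2's quantifier: j varies over all the output nodes that receive input from n, so the sum runs over exactly those connections.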